    A User study on visualization of agent migration between two companion robots

    In order to provide continuous user assistance in different physical situations and circumstances, it is desirable that an agent can maintain its identity as it migrates between different physical embodiments. A user study was conducted with 21 primary school students, investigating the use of three different visual cues to support the users' belief that they are still interacting with the same agent as it migrates between different robotic embodiments.

    Integrating Constrained Experiments in Long-term Human-Robot Interaction using Task– and Scenario–based Prototyping

    In order to investigate how the use of robots may impact everyday tasks, 12 participants interacted with a University of Hertfordshire Sunflower robot over a period of 8 weeks in the university's Robot House. Participants performed two constrained tasks, one physical and one cognitive, four times over this period. Participant responses were recorded using a variety of measures, including the System Usability Scale and the NASA Task Load Index. The use of the robot had a different impact on participants' experienced workload for the two tasks, and this effect changed over time. In the physical task, there was evidence of adaptation to the robot's behaviour; in the cognitive task, the use of the robot was experienced as more frustrating in the later weeks.

    Comparing human robot interaction scenarios using live and video based methods: towards a novel methodological approach

    This paper presents results of a pilot study that investigated whether people's perceptions from live and video HRI trials were comparable. Subjects participated in a live HRI trial and videotaped HRI trials; the scenario for both was identical and involved a robot fetching an object using different approach directions. Results of the trials indicated moderate to high levels of agreement between the live and video-based HRI trials for subjects' preferences and opinions. This methodology is in its infancy and should not be seen as a replacement for live trials. However, our results indicate that for certain HRI scenarios, videotaped trials do have potential as a technique for prototyping, testing and developing HRI scenarios, and for testing methodologies for use in definitive live trials. DOI: 10.1109/AMC.2006.1631754

    Video prototyping of dog-inspired non-verbal affective communication for an appearance constrained robot

    This paper presents results from a video human-robot interaction (VHRI) study in which participants viewed a video of an appearance-constrained Pioneer robot using dog-inspired affective cues to communicate affinity and relationship with its owner and a guest through proxemics, body movement and orientation, and camera orientation. The findings suggest that even with the limited modalities for non-verbal expression offered by a Pioneer robot, which does not have a dog-like appearance, these cues were effective for non-verbal affective communication.

    In good company? : Perception of movement synchrony of a non-anthropomorphic robot

    Recent technological developments, like cheap sensors and the decreasing cost of computational power, have brought the possibility of robotic home companions within reach. In order to be accepted, it is vital for these robots to be able to participate meaningfully in social interactions with their users and to make them feel comfortable during these interactions. In this study we investigated how people respond to a situation where a companion robot is watching its user. Specifically, we tested the effect of robotic behaviours that are synchronised with the actions of a human. We evaluated the effects of these behaviours on the robot's likeability and perceived intelligence using an online video survey. The robot used was Care-O-botÂź 3, a non-anthropomorphic robot with a limited range of expressive motions. We found that even minimal, positively synchronised movements during an object-oriented task were interpreted by participants as engagement and created a positive disposition towards the robot. However, even negatively synchronised movements of the robot led to more positive perceptions of the robot, as compared to a robot that does not move at all. The results emphasise (a) the powerful role that robot movements in general can have on participants' perception of the robot, and (b) that synchronisation of body movements can be a powerful means to enhance the positive attitude towards a non-anthropomorphic robot.

    Exploring the Design Space of Robot Appearance and Behavior in an Attention-Seeking Living Room Scenario for a Robot Companion

    DOI: 10.1109/ALIFE.2007.36781

    What is a robot companion - friend, assistant or butler?

    The study presented in this paper explored people's perceptions and attitudes towards the idea of a future robot companion for the home. A human-centred approach was adopted using questionnaires and human-robot interaction trials to derive data from 28 adults. Results indicated that a large proportion of participants were in favour of a robot companion and saw its potential role as being an assistant, machine or servant. Few wanted a robot companion to be a friend. Household tasks were preferred to child/animal care tasks. Humanlike communication was desirable for a robot companion, whereas humanlike behaviour and appearance were less essential. Results are discussed in relation to future research directions for the development of robot companions.

    An empirical framework for human-robot proxemics

    The work described in this paper was conducted within the EU Integrated Projects COGNIRON ("The Cognitive Robot Companion") and LIREC (LIving with Robots and intEractive Companions) and was funded by the European Commission under contract numbers FP6-002020 and FP7-215554. An empirical framework for Human-Robot (HR) proxemics is proposed, which shows how the measurement and control of interpersonal distances between a human and a robot can potentially be used by the robot to interpret, predict and manipulate proxemic behaviour for Human-Robot Interactions (HRIs). The proxemic framework provides for the incorporation of inter-factor effects, and can be extended to incorporate new factors, updated values and results. The framework is critically discussed and future work is proposed.
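    The abstract does not give the framework's actual factors or values, but the general shape of a factor-based distance model with inter-factor effects can be sketched roughly as below. Every factor name, adjustment value and the additive structure are illustrative assumptions, not figures from the published framework.

        # Hypothetical factor-based proxemics sketch; all names and values are
        # illustrative assumptions, not data from the published framework.

        BASE_DISTANCE_M = 1.2  # assumed baseline comfortable approach distance

        # Assumed single-factor adjustments in metres (positive = keep further away).
        FACTOR_ADJUSTMENTS = {
            ("approach_direction", "frontal"): 0.15,
            ("approach_direction", "side"): -0.10,
            ("task", "handover"): -0.20,
            ("task", "verbal_only"): 0.10,
        }

        # Assumed inter-factor effects layered on top of the single-factor terms.
        INTERACTION_ADJUSTMENTS = {
            frozenset({("approach_direction", "frontal"), ("task", "handover")}): 0.05,
        }

        def preferred_distance(active_factors):
            """Estimate a comfortable human-robot distance for the given factor settings."""
            distance = BASE_DISTANCE_M
            for factor in active_factors:
                distance += FACTOR_ADJUSTMENTS.get(factor, 0.0)
            for pair, adjustment in INTERACTION_ADJUSTMENTS.items():
                if pair <= set(active_factors):
                    distance += adjustment
            return distance

        if __name__ == "__main__":
            settings = [("approach_direction", "frontal"), ("task", "handover")]
            print(f"Preferred approach distance: {preferred_distance(settings):.2f} m")

    In a sketch of this kind, extending the model with a new factor or an updated value only requires adding an entry to the corresponding table, which mirrors the extensibility the abstract describes.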

    Formal verification of an autonomous personal robotic assistant

    Human–robot teams are likely to be used in a variety of situations wherever humans require the assistance of robotic systems. Obvious examples include healthcare and manufacturing, in which people need the assistance of machines to perform key tasks. It is essential for robots working in close proximity to people to be both safe and trustworthy. In this paper we examine formal verification of a high-level planner/scheduler for autonomous personal robotic assistants such as Care-O-botℱ. We describe how a model of Care-O-bot and its environment was developed using Brahms, a multiagent workflow language. Formal verification was then carried out by translating this to the input language of an existing model checker. Finally, we present some formal verification results and describe how these could be complemented by simulation-based testing and real-world end-user validation in order to increase the practical and perceived safety and trustworthiness of robotic assistants.
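    The paper's own toolchain (a Brahms workflow model translated into the input language of an existing model checker) is not reproduced here; as a minimal, purely illustrative sketch of the underlying idea, the following Python snippet performs an explicit-state reachability check of a safety property on a toy robot-assistant model. All state names, transitions and the property itself are invented for illustration.

        # Toy explicit-state safety check; this is NOT the Brahms/model-checker
        # toolchain used in the paper. States, transitions and the property are
        # invented purely to illustrate exhaustive checking of a model.

        from collections import deque

        # Assumed state: (robot_activity, human_presence)
        TRANSITIONS = {
            ("idle", "absent"): {("idle", "present"), ("moving", "absent")},
            ("idle", "present"): {("approaching", "present"), ("idle", "absent")},
            ("moving", "absent"): {("idle", "absent"), ("stopped", "present")},
            ("approaching", "present"): {("stopped", "present")},
            ("stopped", "present"): {("idle", "present")},
        }

        def is_unsafe(state):
            """Assumed safety property: the robot never drives while a human is present."""
            activity, presence = state
            return activity == "moving" and presence == "present"

        def check_safety(initial):
            """Breadth-first search over all reachable states; return (holds, witness)."""
            frontier, seen = deque([initial]), {initial}
            while frontier:
                state = frontier.popleft()
                if is_unsafe(state):
                    return False, state
                for successor in TRANSITIONS.get(state, ()):
                    if successor not in seen:
                        seen.add(successor)
                        frontier.append(successor)
            return True, None

        if __name__ == "__main__":
            holds, witness = check_safety(("idle", "absent"))
            print("Safety property holds" if holds else f"Violation reachable: {witness}")

    A production model checker additionally handles temporal-logic properties and far larger state spaces, but an exhaustive search over reachable states, as above, is the core mechanism behind verifying safety properties of this kind.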
    • 

    corecore